Abstract
Have you ever noticed that when "we need more time", we can
increase the number of things we do in a second, but when
"we need more space", we don't use a magnifying glass?
Principle of Self-organization
Redundancy can be increased in two ways:
by increasing H_max
by decreasing H
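A rough numeric sketch of these two levers (assuming Shannon's
usual definitions, H = -sum p_i log2 p_i over the observed symbol
frequencies, H_max = log2 of the number of available states, and
redundancy R = 1 - H/H_max; the example strings are illustrative
only):

    import math
    from collections import Counter

    def shannon_entropy(symbols):
        # H = -sum p_i * log2(p_i) over the observed symbol frequencies
        counts = Counter(symbols)
        n = len(symbols)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    def redundancy(symbols, available_states):
        # R = 1 - H / H_max, with H_max = log2(number of available states)
        return 1 - shannon_entropy(symbols) / math.log2(available_states)

    even   = "abcabcabc"   # three symbols used evenly
    skewed = "aaaaaaabc"   # same three symbols, but H is lower

    print(redundancy(even, 3))    # baseline: R = 0
    print(redundancy(even, 4))    # raising H_max (more available states) raises R
    print(redundancy(skewed, 3))  # lowering H raises R as well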
Increasing H_max:
Increasing 'brightness' (intensity) exposes more available states
but increases the system energy. The system is then no longer closed,
so increasing H_max alters a "closed" system's behaviour.
Decreasing H:
Increasing the system's 'contrast' by decreasing H
reduces the number of distinguishable[2] states, but
does not necessarily increase the potential energy
in the system[1].
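A minimal illustration of the 'contrast' idea (anticipating note [2];
the pixel values and the hard threshold at 128 are an assumed example,
not a claim about any particular display):

    import math
    from collections import Counter

    def entropy(values):
        counts = Counter(values)
        n = len(values)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    pixels = [12, 40, 77, 98, 130, 161, 190, 220, 55, 145, 201, 88]
    hard_contrast = [0 if p < 128 else 255 for p in pixels]  # push every value to black or white

    print(entropy(pixels))         # many distinguishable levels -> higher H
    print(entropy(hard_contrast))  # only two distinguishable levels remain -> lower H

Only the number of states we resolve has dropped; the physical
system underneath need not have changed.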
Spatio-temporal redundancy:
Increasing H_max is analogous to increasing the number
of 'spatial' states.
Increasing H_max increases the number of possible parallel
or simultaneous observations 'spatially' by increasing the
number of available states; thus H is, in a sense, 'virtually'
reduced relative to H_max.
More intuitively, this can be modelled as increasing the number
of relative parallel observers, without affecting their individual
sample-rates (times). Each observer can be treated as a distinct
path within phase space. It is not necessary that every path
use the same language.
This action has side effects:
State Tunneling
Linguistic Indeterminacy
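To make the 'virtual' reduction above concrete, a minimal sketch,
assuming the parallel observers all watch the same source, so that
their joint entropy H stays roughly that of the source while the
jointly available states, and hence H_max, multiply (the particular
numbers are assumptions):

    import math

    source_entropy = 1.5     # bits per observation of the underlying source (assumed)
    states_per_observer = 8  # states one observer can distinguish (assumed)

    for observers in (1, 2, 4):
        h_max = observers * math.log2(states_per_observer)  # joint state space grows multiplicatively
        print(observers, source_entropy / h_max)             # H / H_max shrinks as observers are added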
Decreasing H is analogous to increasing the number of
temporal states.
Decreasing H decreases the number of possible states.
This is an absolute reduction of H, not a relative
one as with increasing H_max.
Decreasing H reduces the state space to a more sequential path
through a phase sub-space, which limits the possible state transitions.
We can say that "the former states were possible states and
that now they are not". That they 'were' possible admits that the
phase space is inclusive of them even though they are not used 'now'.
This action has side effects:
Wave-function collapse
Semantic Uncertainty
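A minimal sketch of this absolute reduction, assuming the transitions
that remain are equally likely: restricting each state to fewer
allowed successors lowers the entropy per step, and a strictly
sequential path carries none at all.

    import math

    def bits_per_step(allowed_successors):
        # entropy per transition when the next state is chosen
        # uniformly among the allowed successors
        return math.log2(allowed_successors)

    print(bits_per_step(8))  # any of 8 states may follow: 3 bits per step
    print(bits_per_step(2))  # transitions restricted to 2 successors: 1 bit per step
    print(bits_per_step(1))  # a strictly sequential path: 0 bits per step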
Linguistic Discrimination and Semantic Uncertainty
'Linguistic indeterminacy' is the admission of many
possible languages or protocols in the transmission of
information. Whereas in serial communication, this can
only be done if each observer along a single observational
path can speak the same languages as its nearest neighbors,
multiple paths through phase space allow each path to
represent one observer with its own peculiar language,
and only at some point of convergence is it required
that these languages be translatable.
Because a single path through phase space is sequential
and causative, 'linguistic discrimination' is minimized,
but the 'telephone effect' leads to 'semantic uncertainty'
within the smaller set of languages.
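A toy model of the 'telephone effect', under the bare assumption that
each relay on a serial path corrupts each message bit with some small
probability (the message, hop counts and flip probability are
illustrative only):

    import random

    def relay_chain(bits, hops, flip_prob=0.05, seed=1):
        # pass the message through a chain of relays;
        # each relay may flip each bit independently
        random.seed(seed)
        out = list(bits)
        for _ in range(hops):
            out = [b ^ 1 if random.random() < flip_prob else b for b in out]
        return out

    original = [1, 0, 1, 1, 0, 0, 1, 0] * 16
    for hops in (1, 5, 20):
        received = relay_chain(original, hops)
        errors = sum(a != b for a, b in zip(original, received))
        print(hops, errors / len(original))  # uncertainty grows with the length of the chain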
[1] When the entropy of a battery cell is reduced, the potential
energy of the battery is increased, reducing the number of possible
states of the system.
Mixed oil and vinegar separate in much the same sense that
we attribute to charging a battery, and yet this does not
increase the potential energy of the oil-vinegar "battery":
Oil and Vinegar
In this case it appears that a closed system's potential
energy was expended in the process of separation.
The oil and vinegar system winds up in what appears to be a
"highly organized" binary set of states at max-entropy.
This seems quite contrary to the popularized notion of
"entropy" as being a measure of disorder. Here, max-entropy
is thermodynamically disordered, yet in some sense
it is highly ordered.
Can we say the oil-vinegar max-entropy is of low Shannon entropy
and high thermodynamic entropy?
Janus and Entropy:
"In the latter part of the nineteenth century, the eminent physicist
Maxwell defined entropy as a measure of of the disorder or ``randomness''
in matter."
http://www.cs.auckland.ac.nz/~cthombor/Pubs/Entropy/node2.html
"Entropy, the thermodynamic state function that drives reactions to
equilibrium, was defined by Rudolf Clausius in the early 1860's."
http://www.madsci.org/posts/archives/may98/894567622.Ch.r.html
"Entropy can be defined as the measure of dissorder a system contains.
One way to calculate entropy is to use a formula Ludwig Boltzman
came up with:"
http://inst.augie.edu/~dedeneva/entropy.htm
"The information theory concept of entropy in its present form was
introduced by Shannon in 1948"
http://www.nmia.com/~monsmith/diss/node21.html
"Around 1850, German scientist Rudolf Clausius first introduced the idea of entropy"
http://www.aip.org/physnews/preview/1997/qinfo/sidebar2.htm
"In all processes in closed systems entropy never decreases (Clausius-Boltzmann)."
http://www.plmsc.psu.edu/%7Ewww/matsc597c-1997/introduction/Lecture3/node1.html
"Of course, you already know that thermodynamic entropy ONLY applies
to chemicals and matter, atoms and molecules, not to economics or
information or pollution or human problems, unless they are directly
connected to atomic and molecular behavior, right?"
http://www.2ndlaw.com/entropy.html
"...there are both classical and quantum statistical mechanical
definitions of physical entropy..."
http://www.math.washington.edu/~hillman/entropy.html
"Entropy is a peer-refereed scientific journal."
http://www.mdpi.org/entropy/
http://www.mdpi.org/entropy/entropyweb/defineentropy.htm
"So what is the problem between entropy and evolution?"
http://www.mdpi.org/entropy/entropyweb/prigogine.htm
"Entropy has a bad taste..."
http://www.cpm.mmu.ac.uk/~majordom/entropy/
"The work of Prigogine in drawing attention to non-linear thermodynamics"
http://www-diotima.math.upatras.gr/mirror/autopoiesis/0158.html
"Prigogine called systems which continuously export entropy
in order to maintain their organization dissipative structures."
http://pespmc1.vub.ac.be/SELFORG.html
"K Denbigh (1981) How subjective is entropy?"
http://www.ph.ed.ac.uk/%7Eviv/maxwells-demon.html
"Since Shannon (1945), Jaynes (1957) and Kullback (1959),
the notion of entropy has widely been used in communication
theory and data and signal processing."
http://www.maxent99.idbsu.edu/
"Where do we stand on maximum entropy?"
http://bayes.wustl.edu/etj/node1.html#stand
[2] Here, 'distinguishment' is in terms of 'resolution', and not
necessarily in terms of the actual number of physically
available system states (relative entropy). Increasing
contrast on a television screen reduces its information
entropy, but not necessarily its physical entropy. Such
a distinction between Shannon and Thermodynamic entropy
becomes less apparent as finer resolutions approach the
quasiatomic limit where information and energy are almost
synonymous.
[4] Information here means "new information". Information which is
redundant is useless as "new information" but not useless for
conveying information.